
Additional quality metrics #1981

Merged
merged 19 commits on Sep 27, 2023

Conversation

alejoe91
Member

@alejoe91 alejoe91 commented Sep 12, 2023

This PR adds the following new quality metrics, designed by @musall and slightly modified:

Firing range

A measure of the variation of the firing rate across an experiment, computed as the distance between the 5th and 95th percentiles of local firing rates estimated in 10 s intervals.

Amplitude CV

A measure of the dispersion of the amplitude distribution. The CV is computed over temporal bins sized, for each unit, to contain an average of average_num_spikes_per_bin spikes. Both the median and the range (percentile distance) of the CVs are returned.
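Roughly, the two metrics above could be sketched as follows. This is a minimal illustration with hypothetical function names and signatures, assuming spike times in seconds and one amplitude per spike; it is not the actual SpikeInterface implementation.

```python
import numpy as np

def firing_range(spike_times, duration, bin_s=10.0):
    """Distance between the 5th and 95th percentiles of local firing rates,
    with rates estimated in consecutive bins of bin_s seconds (sketch)."""
    n_bins = int(duration // bin_s)
    edges = np.arange(n_bins + 1) * bin_s
    counts, _ = np.histogram(spike_times, bins=edges)
    rates = counts / bin_s
    # Note: np.percentile expects percentiles in 0-100, not 0-1.
    p5, p95 = np.percentile(rates, [5, 95])
    return p95 - p5

def amplitude_cv(spike_times, amplitudes, duration, avg_spikes_per_bin=50,
                 percentiles=(5, 95)):
    """CV of spike amplitudes per temporal bin, with the bin duration chosen
    so each bin contains ~avg_spikes_per_bin spikes on average (sketch).
    Returns the median and the percentile range of the per-bin CVs."""
    firing_rate = len(spike_times) / duration
    bin_s = avg_spikes_per_bin / firing_rate
    edges = np.arange(0, duration + bin_s, bin_s)
    cvs = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        amps = amplitudes[(spike_times >= lo) & (spike_times < hi)]
        if len(amps) > 1 and amps.mean() != 0:
            cvs.append(amps.std() / amps.mean())
    cvs = np.asarray(cvs)
    p_lo, p_hi = np.percentile(cvs, percentiles)
    return np.median(cvs), p_hi - p_lo
```

For a perfectly regular unit (constant rate, constant amplitude), both the firing range and the amplitude CV median/range come out as zero, which is a quick sanity check on the sketch.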

@alejoe91 alejoe91 added the qualitymetrics Related to qualitymetrics module label Sep 12, 2023
Collaborator

@zm711 zm711 left a comment


A couple of typos :) and two clarifications.

Review threads (resolved) on:
doc/modules/qualitymetrics/amplitude_spread.rst
doc/modules/qualitymetrics/firing_range.rst
src/spikeinterface/qualitymetrics/misc_metrics.py
@musall

musall commented Sep 12, 2023

Thank you for the improved implementation!
Looks good to me and the added normalization should make it more generalizable.

@alejoe91
Member Author

Ready for final revision! There was a bug with the percentile numbers (they should be 0-100, not 0-1), which has now been fixed.

@@ -522,37 +523,41 @@ def compute_synchrony_metrics(waveform_extractor, synchrony_sizes=(2, 4, 8), **k
Based on concepts described in [Gruen]_
This code was adapted from `Elephant - Electrophysiology Analysis Toolkit <https://github.com/NeuralEnsemble/elephant/blob/master/elephant/spike_train_synchrony.py#L245>`_
"""
-    assert np.all(s > 1 for s in synchrony_sizes), "Synchrony sizes must be greater than 1"
+    assert np.all([s > 1 for s in synchrony_sizes]), "Synchrony sizes must be greater than 1"
Collaborator


Super minor, but couldn't you just do assert min(synchrony_sizes) > 1?

5.2 µs ± 23.8 ns per loop (mean ± std. dev. of 7 runs, 100,000 loops each) # np.all with list comprehension
135 ns ± 1.18 ns per loop (mean ± std. dev. of 7 runs, 10,000,000 loops each) # min with tuple/list
795 ns ± 2.93 ns per loop (mean ± std. dev. of 7 runs, 1,000,000 loops each) # min with np.array

But maybe I'm missing something?
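The gotcha behind the diff above is worth spelling out: np.all applied to a generator expression does not iterate it; it wraps the generator object in a 0-d array, and since a generator object is truthy, the check always passes. A small illustration (with made-up values, not from the PR):

```python
import numpy as np

synchrony_sizes = (0, 2, 4)  # 0 is an invalid size and should fail the check

# Bug: np.all receives the generator object itself, which is truthy,
# so the assertion "passes" regardless of the values.
buggy = bool(np.all(s > 1 for s in synchrony_sizes))

# Fix applied in the diff: materialize the comparisons into a list first.
fixed = bool(np.all([s > 1 for s in synchrony_sizes]))

# The simpler (and, per the timings above, faster) alternative suggested
# in the review: no NumPy needed at all.
simplest = min(synchrony_sizes) > 1
```

Here buggy evaluates to True despite the invalid size, while both fixed and simplest correctly evaluate to False.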

Member Author


Yes! Eheheh

Member Author


Done!

@alejoe91 alejoe91 merged commit fe4c690 into SpikeInterface:main Sep 27, 2023
8 checks passed
@alejoe91 alejoe91 deleted the additional-quality-metrics branch October 17, 2023 08:21